Semi-feature Level Fusion for Bimodal Affect Regression Based on Facial and Bodily Expressions

Authors

  • Yang Zhang
  • Li Zhang
Abstract

Automatic emotion recognition has been widely studied and applied to various computer vision tasks (e.g. health monitoring, driver state surveillance, personalized learning, and security monitoring). As revealed by recent psychological and behavioral research, facial expressions are good at communicating categorical emotions (e.g. happy, sad, surprise), while bodily expressions could contribute more to the perception of dimensional emotional states (e.g. arousal and valence). In this paper, we propose a semi-feature level fusion framework that incorporates affective information from both the facial and bodily modalities to draw a more reliable interpretation of users’ emotional states in a valence–arousal space. A Genetic Algorithm is also applied to conduct automatic feature optimization. We subsequently propose an ensemble regression model to robustly predict users’ continuous affective dimensions in the valence–arousal space. The empirical findings indicate that, by combining the optimal discriminative bodily features and the derived Action Unit intensities as inputs, the proposed system with adaptive ensemble regressors achieves the best performance for the regression of both the arousal and valence dimensions.
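The sketch below is a minimal, purely illustrative rendering of the pipeline described in the abstract, not the authors' implementation. Assuming a NumPy/scikit-learn setting, it concatenates assumed Action Unit intensities and bodily descriptors (semi-feature-level fusion), runs a toy Genetic Algorithm for feature selection, and averages two heterogeneous regressors as a simple ensemble for one affective dimension. Feature dimensions, hyper-parameters, and the GA design are all assumptions.

    # Illustrative sketch only (not the authors' code): semi-feature-level fusion of
    # facial Action Unit intensities with bodily features, a toy Genetic Algorithm
    # for feature selection, and an averaged ensemble of regressors for one
    # affective dimension (valence). All data, dimensions and settings are assumed.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Placeholder data: 200 samples, 17 assumed AU intensities + 30 bodily
    # descriptors, with continuous valence labels in [-1, 1].
    X_face = rng.normal(size=(200, 17))
    X_body = rng.normal(size=(200, 30))
    y_valence = rng.uniform(-1, 1, size=200)

    # Semi-feature-level fusion: concatenate the two modalities' feature vectors.
    X_fused = np.hstack([X_face, X_body])

    def fitness(mask, X, y):
        # Negative mean absolute error of a small regressor on the selected subset.
        if mask.sum() == 0:
            return -np.inf
        model = RandomForestRegressor(n_estimators=25, random_state=0)
        return cross_val_score(model, X[:, mask.astype(bool)], y, cv=3,
                               scoring="neg_mean_absolute_error").mean()

    def ga_feature_selection(X, y, pop_size=12, generations=5, p_mut=0.05):
        # Minimal elitist GA over binary feature masks
        # (one-point crossover, bit-flip mutation).
        n = X.shape[1]
        pop = rng.integers(0, 2, size=(pop_size, n))
        for _ in range(generations):
            fits = np.array([fitness(ind, X, y) for ind in pop])
            parents = pop[np.argsort(fits)[::-1][: pop_size // 2]]  # keep fitter half
            children = []
            for _ in range(pop_size - len(parents)):
                a, b = parents[rng.integers(len(parents), size=2)]
                cut = rng.integers(1, n)
                child = np.concatenate([a[:cut], b[cut:]])
                child[rng.random(n) < p_mut] ^= 1
                children.append(child)
            pop = np.vstack([parents, children])
        fits = np.array([fitness(ind, X, y) for ind in pop])
        return pop[np.argmax(fits)].astype(bool)

    best_mask = ga_feature_selection(X_fused, y_valence)
    X_sel = X_fused[:, best_mask]

    # Simple ensemble regression: average the predictions of two regressors.
    models = [RandomForestRegressor(n_estimators=200, random_state=0), SVR(C=1.0)]
    preds = np.mean([m.fit(X_sel, y_valence).predict(X_sel) for m in models], axis=0)
    print("selected features:", int(best_mask.sum()),
          "| training MAE:", float(np.abs(preds - y_valence).mean()))

In the paper the same kind of pipeline would presumably be applied to the arousal dimension as well; only valence is shown here to keep the sketch short.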


Similar Articles

Intelligent emotion recognition from facial and whole-body expressions using adaptive ensemble models

Automatic emotion recognition has been widely studied and applied to various computer vision tasks (e.g. health monitoring, driver state surveillance, personalized learning, and security monitoring). With the great potential provided by current advanced 3D scanner technology (e.g. the Kinect), we shed light on robust emotion recognition based on users’ facial and whole-body expressions. As re...


Analysis and Synthesis of Facial Expressions by Feature-Points Tracking and Deformable Model

Facial expression recognition is useful for designing new interactive devices, offering the possibility of new ways for humans to interact with computer systems. In this paper we develop a facial expression analysis and synthesis system. The analysis part of the system is based on the facial features extracted from facial feature points (FFP) in frontal image sequences. Selected facial feature poi...


A New Information Fusion Method for Bimodal Robotic Emotion Recognition

Emotion recognition has become a popular area in human-robot interaction research. Through recognizing facial expressions, a robot can interact with a person in a more friendly manner. In this paper, we propose a bimodal emotion recognition system that combines image and speech signals. A novel probabilistic strategy has been studied for a support vector machine (SVM)-based classification desig...


Synthesis of human facial expressions based on the distribution of elastic force applied by control points

Facial expressions play an essential role in delivering emotions. Thus, facial expression synthesis has gained interest in many fields such as computer vision and graphics. Facial actions are generated by contraction and relaxation of the muscles innervated by facial nerves. The combinations of those muscle motions are numerous; therefore, facial expressions are often person-specific. But in general, f...


Fusion of facial expressions and EEG for implicit affective tagging

The explosion of user-generated, untagged multimedia data in recent years generates a strong need for efficient search and retrieval of this data. The predominant method for content-based tagging is slow, labor-intensive manual annotation. Consequently, automatic tagging is currently a subject of intensive research. However, it is clear that the process will not b...




Publication year: 2015